The Leave-One-Out Kernel
Authors
Abstract
Recently, several attempts have been made to derive data-dependent kernels from distribution estimates obtained with parametric models (e.g., the Fisher kernel). In this paper, we propose a new kernel that can be derived from any distribution estimator, parametric or nonparametric. We call this kernel the leave-one-out (LOO) kernel, because the leave-one-out process plays an important role in computing it. We show that, when applied to a parametric model, the LOO kernel converges to the Fisher kernel asymptotically as the number of samples goes to infinity.
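The abstract does not reproduce the kernel's closed form, but the construction it describes, comparing a distribution estimate with its leave-one-out counterparts, can be illustrated with a nonparametric estimator. The Python sketch below is purely illustrative: the Gaussian kernel density estimator, the function names, and the choice of the log-density change as a per-sample feature are assumptions made here for illustration, not the paper's actual definition of the LOO kernel.

import numpy as np

def gaussian_kde_logpdf(x, data, bandwidth):
    # Log-density of the point x under a Gaussian KDE fitted to `data`.
    diffs = (x - data) / bandwidth
    log_kernels = -0.5 * diffs**2 - 0.5 * np.log(2 * np.pi) - np.log(bandwidth)
    return np.logaddexp.reduce(log_kernels) - np.log(len(data))

def loo_features(samples, bandwidth=0.5):
    # For each sample, measure how the density estimate at that point
    # changes when the sample itself is left out (illustrative feature only).
    n = len(samples)
    feats = np.empty(n)
    for i in range(n):
        full = gaussian_kde_logpdf(samples[i], samples, bandwidth)
        loo = gaussian_kde_logpdf(samples[i], np.delete(samples, i), bandwidth)
        feats[i] = full - loo  # sensitivity of the estimate to sample i
    return feats

def loo_gram_matrix(samples, bandwidth=0.5):
    # Hypothetical Gram matrix built from the leave-one-out features.
    f = loo_features(samples, bandwidth)
    return np.outer(f, f)

X = np.random.default_rng(0).normal(size=20)
K = loo_gram_matrix(X)  # 20 x 20 positive semi-definite matrix

Because this Gram matrix is an outer product of real-valued features, it is positive semi-definite by construction; the paper's asymptotic claim (convergence to the Fisher kernel for parametric models) concerns the actual LOO kernel, not this toy.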
Similar Resources
Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
Kernel Fisher discriminant analysis (KFD) is a successful approach to classification. It is well known that the key challenge in KFD lies in the selection of free parameters such as kernel parameters and regularization parameters. Here we focus on the feature-scaling kernel, in which each feature is individually associated with a scaling factor. A novel algorithm, named FS-KFD, is developed to tune th...
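A feature-scaling kernel of the kind this snippet describes attaches one scaling factor to each input dimension, as in an anisotropic (ARD-style) RBF kernel. The functional form below is a common choice assumed here for illustration; the paper's FS-KFD algorithm, which tunes the scales via leave-one-out cross-validation, is not reproduced.

import numpy as np

def scaled_rbf_kernel(X, Z, scales):
    # Anisotropic RBF kernel: feature d is weighted by scales[d], so
    # K[i, j] = exp(-sum_d scales[d] * (X[i, d] - Z[j, d])**2).
    diffs = X[:, None, :] - Z[None, :, :]      # shape (n, m, d)
    sq = (diffs ** 2 * scales).sum(axis=-1)    # weighted squared distances
    return np.exp(-sq)

X = np.random.default_rng(0).normal(size=(5, 3))
K = scaled_rbf_kernel(X, X, scales=np.array([1.0, 0.5, 2.0]))

Driving a scale toward zero effectively removes the corresponding feature, which is what makes scale tuning a form of feature selection.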
Optimally regularised kernel Fisher discriminant classification
Mika, Rätsch, Weston, Schölkopf and Müller [Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Neural networks for signal processing: Vol. IX (pp. 41-48). New York: IEEE Press] introduce a non-linear formulation of Fisher's linear discriminant, based on the now familiar "kernel trick", demonstrating state-of-the-art performance...
Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
Mika et al. (in: Neural Networks for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the "kernel trick" to obtain a non-linear variant of Fisher's linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a ...
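For contrast with the efficient implementation announced above, naive leave-one-out cross-validation refits the classifier once per held-out sample, i.e. n full trainings. The sketch below shows that naive baseline; scikit-learn's LeaveOneOut is used, and an RBF support vector classifier stands in for the kernel Fisher discriminant (an assumption made here, since the paper's closed-form shortcut is not reproduced).

import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

def naive_loo_error(X, y, make_clf):
    # Naive leave-one-out error estimate: O(n) full trainings.
    errors = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf = make_clf()
        clf.fit(X[train_idx], y[train_idx])
        errors += int(clf.predict(X[test_idx])[0] != y[test_idx][0])
    return errors / len(X)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
err = naive_loo_error(X, y, lambda: SVC(kernel="rbf"))

The point of the cited paper is precisely to avoid this n-fold retraining for kernel Fisher discriminant classifiers.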
Fast Kernel Classifier Construction Using Orthogonal Forward Selection to Minimise Leave-One-Out Misclassification Rate
We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. In order to optimise the classifier's generalisation capability, an orthogonal forward selection procedure is used to select kernels one by one by minimising the leave-one-out (LOO) misclassification rate directly. It is shown that the computation of the LOO misclassification rate is very effi...
Leave-One-Out Support Vector Machines
We present a new learning algorithm for pattern recognition inspired by a recent upper bound on leave-one-out error [Jaakkola and Haussler, 1999] proved for Support Vector Machines (SVMs) [Vapnik, 1995; 1998]. The new approach directly minimizes the expression given by the bound in an attempt to minimize leave-one-out error. This gives a convex optimization problem which constructs a sparse lin...